An interior proximal method in vector optimization
Authors
Abstract
This paper studies the vector optimization problem of finding weakly efficient points for maps from R^n to R^m, with respect to the partial order induced by a closed, convex, and pointed cone C ⊂ R^m with nonempty interior. For this problem we develop an extension of the proximal point method for scalar-valued convex optimization, with a modified convergence sensing condition that allows us to construct an interior proximal method for solving the vector optimization problem on nonpolyhedral sets.
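The scalar proximal point iteration that the paper extends can be sketched as follows. This is a minimal illustration of the classical method, not the authors' vector-valued algorithm; solving each regularized subproblem inexactly by gradient descent, and the step sizes and iteration counts, are assumptions made for demonstration:

```python
import numpy as np

def proximal_point(grad_f, x0, lam=1.0, n_outer=100, n_inner=500, lr=0.05):
    """Classical proximal point iteration for minimizing a smooth convex f:
        x_{k+1} = argmin_x f(x) + (1 / (2*lam)) * ||x - x_k||^2.
    Each subproblem is solved approximately by gradient descent (an
    implementation choice for this sketch, not part of the method itself)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_outer):
        xk = x.copy()
        for _ in range(n_inner):
            # Gradient of the regularized subproblem: grad f(x) + (x - xk)/lam.
            g = grad_f(x) + (x - xk) / lam
            x = x - lr * g
    return x

# Example: minimize f(x) = ||x - c||^2, whose unique minimizer is c.
c = np.array([1.0, -2.0])
sol = proximal_point(lambda x: 2.0 * (x - c), np.zeros(2))
```

The quadratic regularization term makes each subproblem strongly convex, which is what gives the outer iteration its stability; the interior proximal variants discussed in this paper replace the squared Euclidean distance with a distance adapted to the feasible set.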
Similar resources
Interior proximal algorithm with variable metric for second-order cone programming: applications to structural optimization and support vector machines
In this work, we propose an inexact interior proximal type algorithm for solving convex second-order cone programs. This kind of problem consists of minimizing a convex (possibly nonsmooth) function over the intersection of an affine linear space with the Cartesian product of second-order cones. The proposed algorithm uses a variable metric distance, which is induced by a class of positive def...
A full Nesterov-Todd step interior-point method for circular cone optimization
In this paper, we present a full Newton step feasible interior-point method for circular cone optimization by using Euclidean Jordan algebra. The search direction is based on the Nesterov-Todd scaling scheme, and only full-Newton steps are used at each iteration. Furthermore, we derive the iteration bound that coincides with the currently best known iteration bound for small-update methods.
Interior Proximal Method for Variational Inequalities: Case of Non-paramonotone Operators
For variational inequalities characterizing saddle points of Lagrangians associated with convex programming problems in Hilbert spaces, the convergence of an interior proximal method based on Bregman distance functionals is studied. The convergence results admit a successive approximation of the variational inequality and an inexact treatment of the proximal iterations.
Convex optimization techniques for the efficient recovery of a sparsely corrupted low-rank matrix
We address the problem of recovering a low-rank matrix that has a small fraction of its entries arbitrarily corrupted. This problem has recently attracted attention as a nontrivial extension of the classical PCA (principal component analysis) problem, with applications in image processing and model/system identification. It was shown that the problem can be solved via a convex optimization formula...
Smoothing proximal gradient method for general structured sparse regression
We study the problem of estimating high-dimensional regression models regularized by a structured-sparsity-inducing penalty that encodes prior structural information on either the input or the output side. We consider two widely adopted types of such penalties as our motivating examples: 1) the overlapping-group-lasso penalty, based on the l1/l2 mixed-norm penalty, and 2) the graph-guided fusion penalty. For ...
Journal:
- European Journal of Operational Research
Volume 214, Issue -
Pages: -
Publication date: 2011